Optimal rates of entropy estimation over Lipschitz balls

Authors

  • Yanjun Han
  • Jiantao Jiao
  • Tsachy Weissman
  • Yihong Wu
Abstract

We consider the problem of minimax estimation of the entropy of a density over Lipschitz balls. Dropping the usual assumption that the density is bounded away from zero, we obtain the minimax rate (n ln n)^{-s/(s+d)} + n^{-1/2} for 0 < s ≤ 2 in arbitrary dimension d, where s is the smoothness parameter and n is the number of independent samples. Using a two-stage approximation technique, which first approximates the density by its kernel-smoothed version and then approximates the non-smooth functional by polynomials, we construct entropy estimators that attain the minimax rate of convergence, shown optimal by matching lower bounds. One of the key steps in analyzing the bias relies on a novel application of the Hardy-Littlewood maximal inequality, which also leads to a new inequality on Fisher information that might be of independent interest.
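As a point of reference for the kernel-smoothing stage described above, the following is a minimal plug-in sketch in one dimension. It is not the minimax-optimal estimator of the paper (which additionally replaces the non-smooth functional -x ln x by a polynomial approximation in the low-density regime); the Gaussian kernel, the sample split, and the clipping constant are illustrative assumptions.

# Minimal plug-in sketch of the kernel-smoothing stage (NOT the paper's
# minimax-optimal estimator, which additionally approximates the non-smooth
# functional -x*ln(x) by polynomials where the density is small). The Gaussian
# kernel, the sample split, and the clipping constant are illustrative choices.
import numpy as np
from scipy.stats import gaussian_kde

def plugin_entropy(samples, bandwidth=None):
    """Resubstitution estimate of H(f) = -E[ln f(X)] from i.i.d. 1-D samples."""
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    # Split the sample so the density is not evaluated at its own fitting points.
    fit_half, eval_half = samples[: n // 2], samples[n // 2 :]
    kde = gaussian_kde(fit_half, bw_method=bandwidth)   # kernel-smoothed density
    density = np.clip(kde(eval_half), 1e-12, None)      # guard against ln(0)
    return -np.mean(np.log(density))

# Usage: the differential entropy of N(0,1) is 0.5*ln(2*pi*e) ≈ 1.4189.
rng = np.random.default_rng(0)
print(plugin_entropy(rng.normal(size=10_000)), 0.5 * np.log(2 * np.pi * np.e))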


Similar articles

Modeling of the Maximum Entropy Problem as an Optimal Control Problem and its Application to Pdf Estimation of Electricity Price

In this paper, continuous optimal control theory is used to model and solve the maximum entropy problem for a continuous random variable. The maximum entropy principle provides a method to obtain the least-biased probability density function (pdf) estimate. In this paper, to find a closed-form solution for the maximum entropy problem with any number of moment constraints, the entropy is consi...
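For context on the moment-constrained maximum entropy problem described above, the classical variational argument gives the familiar exponential-family form of the solution; the number of moments m and the multipliers λ_i below are generic symbols rather than quantities from this particular paper.

% Moment-constrained maximum entropy (generic statement):
%   maximize   H(f) = -\int f(x) \ln f(x)\, dx
%   subject to \int x^i f(x)\, dx = \mu_i,  i = 0, 1, \dots, m  (with \mu_0 = 1).
% Stationarity of the Lagrangian yields the exponential-family form
\[
  f^\star(x) = \exp\!\Big( -\lambda_0 - \sum_{i=1}^{m} \lambda_i x^i \Big),
\]
% where the multipliers \lambda_0, \dots, \lambda_m are chosen so that all
% moment constraints are satisfied.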


Almost Sure Limit Theorems for Expanding Maps of the Interval

For a large class of expanding maps of the interval, we prove that partial sums of Lipschitz observables satisfy an almost sure central limit theorem (ASCLT). In fact, we provide a speed of convergence in the Kantorovich metric. Maxima of partial sums are also shown to obey an ASCLT. The key tool is a recently obtained exponential inequality. Then we derive almost-sure convergence rates for th...
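For reference, the almost sure central limit theorem mentioned above is usually stated in the following logarithmically averaged form, with S_k the partial sums and σ² the limiting variance; the specific expanding-map setting and the Kantorovich-metric rate of the paper are not reproduced here.

% Almost sure central limit theorem (standard logarithmic-average form):
% for partial sums S_k with limiting variance \sigma^2 > 0,
\[
  \frac{1}{\ln N} \sum_{k=1}^{N} \frac{1}{k}\,
    \mathbf{1}\!\left\{ \frac{S_k}{\sigma \sqrt{k}} \le x \right\}
  \;\longrightarrow\; \Phi(x)
  \qquad \text{almost surely, for every } x \in \mathbb{R},
\]
% where \Phi denotes the standard normal distribution function; equivalently,
% the logarithmic averages of the laws of S_k/(\sigma\sqrt{k}) converge weakly
% to N(0,1) almost surely.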


A Two Stage k-Monotone B-Spline Regression Estimator: Uniform Lipschitz Property and Optimal Convergence Rate

This paper considers k-monotone estimation and the related asymptotic performance analysis over a suitable Hölder class for general k. A novel two stage k-monotone B-spline estimator is proposed: in the first stage, an unconstrained estimator with optimal asymptotic performance is considered; in the second stage, a k-monotone B-spline estimator is constructed by projecting the unconstrained est...
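A toy version of the two-stage idea, specialized to the monotone case k = 1 and heavily simplified: stage one fits an unconstrained least-squares B-spline, and stage two projects its coefficient sequence onto nondecreasing sequences, which is sufficient (though not necessary) for the fitted spline to be nondecreasing. The knot placement and the coefficient-space projection below are illustrative choices, not the estimator analyzed in the paper.

# Toy two-stage sketch for the monotone case k = 1 (NOT the paper's estimator):
# stage 1 fits an unconstrained least-squares B-spline; stage 2 projects its
# coefficient sequence onto nondecreasing sequences, which is sufficient
# (though not necessary) for the fitted spline itself to be nondecreasing.
import numpy as np
from scipy.interpolate import BSpline, make_lsq_spline
from sklearn.isotonic import IsotonicRegression

def two_stage_monotone_spline(x, y, n_interior_knots=8, degree=3):
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    order = np.argsort(x)
    x, y = x[order], y[order]
    # Stage 1: unconstrained least-squares B-spline fit on quantile-based knots.
    interior = np.quantile(x, np.linspace(0, 1, n_interior_knots + 2)[1:-1])
    knots = np.r_[[x[0]] * (degree + 1), interior, [x[-1]] * (degree + 1)]
    unconstrained = make_lsq_spline(x, y, knots, k=degree)
    # Stage 2: project coefficients onto the monotone cone via isotonic regression.
    coeffs = unconstrained.c
    coeffs_monotone = IsotonicRegression(increasing=True).fit_transform(
        np.arange(coeffs.size), coeffs
    )
    return BSpline(knots, coeffs_monotone, degree)

# Usage: recover a monotone trend from noisy observations of sqrt(x).
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 400))
y = np.sqrt(x) + 0.1 * rng.normal(size=x.size)
fit = two_stage_monotone_spline(x, y)
grid = np.linspace(x[0], x[-1], 200)
print(bool(np.all(np.diff(fit(grid)) >= -1e-9)))  # fitted curve is nondecreasing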


Minimax rates of estimation for high-dimensional linear regression over $\ell_q$-balls

Consider the standard linear regression model Y = Xβ + w, where Y ∈ R^n is an observation vector, X ∈ R^{n×d} is a design matrix, β ∈ R^d is the unknown regression vector, and w ∼ N(0, σ²I) is additive Gaussian noise. This paper studies the minimax rates of convergence for estimation of β in ℓ_p-losses and in the ℓ_2-prediction loss, assuming that β belongs to an ℓ_q-ball B_q(R_q) for some q ∈ [0, 1]. We show ...
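For orientation, the ℓ_q-ball constraint referred to above and the headline rate from this line of work take roughly the following form; constants and the precise conditions on the design matrix are omitted, and the displayed rate is a paraphrase rather than a verbatim statement of the paper's theorem.

% The \ell_q-ball of radius R_q, for q \in [0,1]:
\[
  B_q(R_q) = \Big\{ \beta \in \mathbb{R}^d : \sum_{j=1}^{d} |\beta_j|^q \le R_q \Big\},
\]
% interpolating between hard sparsity (q = 0, counting nonzero entries) and the
% \ell_1-ball (q = 1). Under suitable conditions on the design matrix X, the
% minimax squared \ell_2-estimation error over B_q(R_q) scales, up to constants, as
\[
  \inf_{\hat\beta} \sup_{\beta \in B_q(R_q)}
    \mathbb{E} \, \| \hat\beta - \beta \|_2^2
  \;\asymp\; R_q \Big( \frac{\sigma^2 \log d}{n} \Big)^{1 - q/2},
  \qquad q \in (0, 1].
\]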




Journal:
  • CoRR

Volume: abs/1711.02141   Issue: -

Pages: -

Publication date: 2017